Convergence properties and data efficiency of the minimum error entropy criterion in ADALINE training
Authors
Abstract
Recently, we have proposed the minimum error entropy (MEE) criterion as an information theoretic alternative to the widely used mean square error criterion in supervised adaptive system training. For this purpose, we have formulated a nonparametric estimator for Renyi's entropy that employs Parzen windowing. Mathematical investigation of the proposed entropy estimator revealed interesting insights about the process of information theoretical learning. This new estimator and the associated criteria have been successfully applied to the supervised and unsupervised training of adaptive systems in a wide range of problems. In this paper, we analyze the structure of the MEE performance surface around the optimal solution, and we derive the upper bound on the step size in adaptive linear neuron (ADALINE) training with the steepest descent algorithm using MEE. In addition, the effects of the entropy order and the kernel size in Parzen windowing on the shape of the performance surface and on the eigenvalues of the Hessian at and around the optimal solution are investigated. Conclusions from the theoretical analyses are illustrated through numerical examples.
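As a concrete illustration of the setting described in the abstract, the following minimal Python sketch trains an ADALINE by steepest descent on the quadratic (alpha = 2) MEE criterion, with the error pdf estimated by Gaussian Parzen windowing. The data, kernel size sigma, step size eta, and all function names are illustrative assumptions and are not taken from the paper; in particular, eta is chosen ad hoc rather than from the step-size bound the paper derives.

import numpy as np

# Minimal MEE/ADALINE sketch (illustrative, not the paper's code).
# Minimizing Renyi's quadratic entropy H2 = -log V(e) is equivalent to
# maximizing the information potential V(e) of the error samples.

def information_potential(e, sigma):
    # V(e) = (1/N^2) * sum_ij G_{sqrt(2)*sigma}(e_i - e_j), Gaussian Parzen window
    s2 = 2.0 * sigma**2                          # variance of the pairwise kernel
    diff = e[:, None] - e[None, :]
    k = np.exp(-diff**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return k.mean()

def mee_gradient(w, X, d, sigma):
    # Gradient of V(e) w.r.t. the ADALINE weights, with e_i = d_i - w^T x_i:
    # dV/dw = (1 / (2 sigma^2 N^2)) * sum_ij G(e_i - e_j) (e_i - e_j) (x_i - x_j)
    N = X.shape[0]
    e = d - X @ w
    s2 = 2.0 * sigma**2
    diff = e[:, None] - e[None, :]
    k = np.exp(-diff**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    dX = X[:, None, :] - X[None, :, :]           # pairwise input differences x_i - x_j
    return np.einsum('ij,ijk->k', k * diff, dX) / (s2 * N**2)

# Synthetic data (hypothetical); no bias term is used because the MEE
# criterion is insensitive to the mean of the error.
rng = np.random.default_rng(0)
N, m = 200, 4
X = rng.standard_normal((N, m))
w_true = np.array([0.5, -1.0, 0.3, 0.8])
d = X @ w_true + 0.1 * rng.standard_normal(N)

w = np.zeros(m)
eta, sigma = 0.5, 1.0                            # ad hoc step size and kernel size
for _ in range(200):
    w += eta * mee_gradient(w, X, d, sigma)      # ascend V  <=>  descend H2

print("estimated weights:", w)
print("information potential:", information_potential(d - X @ w, sigma))

Note that the kernel size sigma smooths the estimated performance surface and changes the Hessian eigenvalues around the optimum, which is exactly the dependence the paper analyzes; in practice sigma and eta would be chosen with that analysis in mind.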
Similar articles
Convergence Analysis of the Information Potential Criterion in Adaline Training
In our recent studies we have proposed the use of the minimum error entropy (MEE) criterion as an alternative to the mean square error (MSE) criterion in supervised adaptive system training. We have formulated a nonparametric estimator for Renyi's entropy with the help of Parzen windowing. This formulation revealed interesting insights about the process of information theoretical learning. We have applied this new ...
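For reference, the quadratic (alpha = 2) case of the Parzen-window entropy estimator mentioned in this excerpt is commonly written as below; the notation (kernel size $\sigma$, sample count $N$) is assumed here rather than quoted from the cited paper.

$$
\hat{H}_2(e) = -\log \hat{V}(e), \qquad
\hat{V}(e) = \frac{1}{N^2}\sum_{i=1}^{N}\sum_{j=1}^{N} G_{\sigma\sqrt{2}}(e_i - e_j),
\qquad
G_s(x) = \frac{1}{s\sqrt{2\pi}}\, e^{-x^2/(2s^2)},
$$

where $\hat{V}(e)$ is the information potential of the error samples, so minimizing the entropy estimate is equivalent to maximizing the information potential.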
Entropy minimization for supervised digital communications channel equalization
This paper investigates the application of error-entropy minimization algorithms to digital communications channel equalization. The pdf of the error between the training sequence and the output of the equalizer is estimated using the Parzen windowing method with a Gaussian kernel, and then Renyi's quadratic entropy is minimized using a gradient descent algorithm. By estimating Renyi's ...
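A hedged sketch of the gradient such a scheme uses, for a linear equalizer $y_i = \mathbf{w}^\top\mathbf{x}_i$ with error $e_i = d_i - y_i$ and a Gaussian kernel of size $\sigma$ (notation assumed, not quoted from the cited paper):

$$
\frac{\partial \hat{V}(e)}{\partial \mathbf{w}}
= \frac{1}{2\sigma^2 N^2}\sum_{i=1}^{N}\sum_{j=1}^{N}
G_{\sigma\sqrt{2}}(e_i - e_j)\,(e_i - e_j)\,(\mathbf{x}_i - \mathbf{x}_j),
$$

and gradient ascent on $\hat{V}$ moves in the same direction as gradient descent on the quadratic Renyi entropy $\hat{H}_2 = -\log\hat{V}$, since the two gradients differ only by the positive factor $1/\hat{V}$.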
Convergence of a Fixed-Point Minimum Error Entropy Algorithm
The minimum error entropy (MEE) criterion is an important learning criterion in information theoretical learning (ITL). However, the MEE solution cannot be obtained in closed form even for a simple linear regression problem, and one usually has to search for it iteratively. The fixed-point iteration is an efficient way to obtain the MEE solution. In this work, we study a fixed-point MEE ...
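For the linear regression case mentioned here, the fixed-point update typically reduces to a pairwise weighted least-squares solve; the sketch below assumes $e_i = d_i - \mathbf{w}^\top\mathbf{x}_i$ and a Gaussian kernel, and is not quoted from the cited paper. With $\kappa_{ij} = G_{\sigma\sqrt{2}}(e_i - e_j)$ evaluated at the current weights, setting the MEE gradient to zero gives the iteration

$$
\mathbf{w}_{k+1} =
\left[\sum_{i,j}\kappa_{ij}\,(\mathbf{x}_i - \mathbf{x}_j)(\mathbf{x}_i - \mathbf{x}_j)^{\top}\right]^{-1}
\sum_{i,j}\kappa_{ij}\,(d_i - d_j)\,(\mathbf{x}_i - \mathbf{x}_j),
$$

which is repeated until the weights stop changing.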
A minimum-error entropy criterion with self-adjusting step-size (MEE-SAS)
In this paper, we propose a minimum-error entropy criterion with self-adjusting step-size (MEE-SAS) as an alternative to the minimum-error entropy (MEE) algorithm for training adaptive systems. MEE-SAS converges faster than MEE for the same misadjustment. We attribute the self-adjusting step-size property of MEE-SAS to its changing curvature, as opposed to MEE, which has ...
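A hedged note on the mechanism, reconstructed from the excerpt rather than quoted from the cited paper: MEE-SAS replaces maximization of the information potential $V(e)$ by minimization of the cost

$$
J(\mathbf{w}) = \big[V(0) - V(e)\big]^2,
\qquad
\nabla_{\mathbf{w}} J = -2\,\big[V(0) - V(e)\big]\,\nabla_{\mathbf{w}} V(e),
$$

where $V(0)$ is the largest value the information potential can attain (all errors equal). A fixed-step descent on $J$ therefore behaves like MEE with an effective step size proportional to $V(0) - V(e)$, which shrinks automatically as the solution approaches the optimum.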
Proportionate Minimum Error Entropy Algorithm for Sparse System Identification
Sparse system identification has received a great deal of attention due to its broad applicability. The proportionate normalized least mean square (PNLMS) algorithm is a popular tool that achieves excellent performance for sparse system identification. In previous studies, most of the cost functions used in proportionate-type sparse adaptive algorithms have been based on the mean square error (MSE) crit ...
Journal: IEEE Trans. Signal Processing
Volume: 51, Issue: -
Pages: -
Publication date: 2003